Sustained Space Complexity
Authors
Abstract
Memory-hard functions (MHF) are functions whose evaluation cost is dominated by memory cost. MHFs are egalitarian, in the sense that evaluating them on dedicated hardware (like FPGAs or ASICs) is not much cheaper than on off-the-shelf hardware (like x86 CPUs). MHFs have interesting cryptographic applications, most notably to password hashing and securing blockchains. Alwen and Serbinenko [STOC’15] define the cumulative memory complexity (cmc) of a function as the sum (over all time-steps) of the amount of memory required to compute the function. They advocate that a good MHF must have high cmc. Unlike previous notions, cmc takes into account that dedicated hardware might exploit amortization and parallelism. Still, cmc has been criticized as insufficient, as it fails to capture possible time-memory trade-offs; since memory cost doesn’t scale linearly, functions with the same cmc could still have very different actual hardware cost. In this work we address this problem and introduce the notion of sustained-memory complexity, which requires that any algorithm evaluating the function must use a large amount of memory for many steps. We construct functions (in the parallel random oracle model) whose sustained-memory complexity is almost optimal: our function can be evaluated using n steps and O(n/log(n)) memory, in each step making one query to the (fixed input-length) random oracle, while any algorithm that can make arbitrarily many parallel queries to the random oracle still needs Ω(n/log(n)) memory for Ω(n) steps. As has been done for various notions (including cmc) before, we reduce the task of constructing MHFs with high sustained-memory complexity to proving pebbling lower bounds on DAGs. Our main technical contribution is the construction of a family of DAGs on n nodes with constant indegree and high “sustained-space complexity”, meaning that any parallel black-pebbling strategy requires Ω(n/log(n)) pebbles for at least Ω(n) steps. Along the way we construct a family of maximally “depth-robust” DAGs with maximum indegree O(log n), improving upon the construction of Mahmoody et al. [ITCS’13], which had maximum indegree O(log n · polylog(log n)).
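To make the distinction between the two cost measures concrete, the following Python sketch (not part of the paper; the traces and the threshold value are hypothetical illustrations) computes the cumulative memory complexity of a per-step memory trace together with the number of steps at which memory usage stays above a threshold s. Two traces with identical cmc can differ sharply in how long a large amount of memory must be held, which is exactly what sustained-memory complexity is meant to capture.

# A minimal sketch, assuming a hypothetical per-step memory trace of an
# evaluation algorithm: trace[t] is the number of memory units (e.g. stored
# hash labels / pebbles) held at time-step t.

from typing import Sequence


def cumulative_memory_complexity(trace: Sequence[int]) -> int:
    """cmc: sum, over all time-steps, of the memory in use at that step."""
    return sum(trace)


def sustained_memory_steps(trace: Sequence[int], s: int) -> int:
    """Number of steps during which at least s memory units are in use.

    High sustained-memory complexity means every evaluation strategy must
    keep >= s memory for many steps (in the paper's construction, s is on
    the order of n/log(n) and the number of such steps is on the order of n).
    """
    return sum(1 for m in trace if m >= s)


if __name__ == "__main__":
    # Two hypothetical traces with the same cmc but very different
    # sustained-memory behaviour: a short burst of high memory usage
    # versus a long plateau of low memory usage.
    burst = [0] * 90 + [100] * 10      # 10 steps at 100 units
    plateau = [10] * 100               # 100 steps at 10 units

    for name, trace in [("burst", burst), ("plateau", plateau)]:
        print(name,
              "cmc =", cumulative_memory_complexity(trace),
              "steps with >= 50 units =", sustained_memory_steps(trace, 50))

Both traces have cmc = 1000, yet only the burst trace ever holds 50 or more units, and only for 10 steps; this is the kind of time-memory trade-off that cmc alone does not distinguish.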
Journal: IACR Cryptology ePrint Archive
Volume: 2018, Issue: -
Pages: -
Publication year: 2018